# Large-scale speech pre-training

## Japanese HuBERT Base

- Publisher: rinna
- License: Apache-2.0
- Tags: Speech Recognition, Transformers, Japanese
- Downloads: 4,550 · Likes: 68

A Japanese HuBERT base model trained by rinna Co., Ltd. on approximately 19,000 hours of Japanese speech from the ReazonSpeech v1 corpus.
## HuBERT Large LL60k

- Publisher: facebook
- License: Apache-2.0
- Tags: Speech Recognition, Transformers, English
- Downloads: 30.99k · Likes: 28

HuBERT is a self-supervised speech representation learning model: an offline clustering step produces aligned target labels for a BERT-like prediction loss. It is suitable for speech recognition, generation, and compression tasks.
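The clustering step described above can be illustrated in isolation: frame-level acoustic features are grouped by k-means, and each frame's cluster id becomes its pseudo-label for the masked-prediction loss. The sketch below is a minimal, self-contained illustration of that idea using NumPy and toy synthetic "frame features" — it is not the actual HuBERT pipeline, and the data and function name are hypothetical.

```python
import numpy as np

def kmeans_pseudo_labels(features, k=4, iters=20, seed=0):
    """Assign each feature frame a cluster id (pseudo-label),
    mimicking HuBERT's offline clustering step (simplified sketch)."""
    rng = np.random.default_rng(seed)
    # Initialize centroids from k randomly chosen frames.
    centroids = features[rng.choice(len(features), size=k, replace=False)]
    for _ in range(iters):
        # Distance from every frame to every centroid.
        dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=-1)
        labels = dists.argmin(axis=1)
        # Recompute centroids; keep the old centroid if a cluster emptied.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = features[labels == j].mean(axis=0)
    return labels

# Toy "frame features": two well-separated blobs standing in for
# frames of two different phones (purely illustrative data).
rng = np.random.default_rng(1)
frames = np.vstack([rng.normal(0.0, 0.1, (50, 13)),
                    rng.normal(5.0, 0.1, (50, 13))])
labels = kmeans_pseudo_labels(frames, k=2)
# Frames from the same blob should end up sharing one pseudo-label.
print(sorted(set(labels.tolist())))
```

In the real training recipe, the features being clustered are first MFCCs and later the model's own intermediate representations, and the resulting frame-level cluster ids serve as the prediction targets for masked frames.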